Search for: All records

Creators/Authors contains: "Hakim, Gregory"


  1. Abstract “Online” data assimilation (DA) is used to generate a seasonal-resolution reanalysis dataset over the last millennium by combining forecasts from a coupled ocean–atmosphere–sea-ice linear inverse model with climate proxy records. Instrumental verification reveals that this reconstruction achieves higher correlation skill in surface temperature than other paleo-DA products while using fewer proxies, particularly during boreal winter when proxy data are scarce. Reconstructed ocean and sea-ice variables also correlate highly with instrumental and satellite datasets. Verification against independent proxy records shows that reconstruction skill is robust throughout the last millennium. Analysis of the results reveals that the method effectively captures the seasonal evolution and amplitude of El Niño events, seasonal temperature trends consistent with orbital forcing over the last millennium, and polar-amplified cooling in the transition from the Medieval Climate Anomaly to the Little Ice Age.
    Free, publicly-accessible full text available September 11, 2026
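A minimal sketch of the "online" cycle described in item 1, under generic assumptions: a linear inverse model (LIM) propagates the ensemble between seasons, and a stochastic ensemble Kalman filter blends the forecast with proxy observations. The propagator G, noise covariance Q, proxy operator H, and all values below are placeholders, not the paper's actual operators.

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, n_obs = 50, 100, 10

G = 0.9 * np.eye(n_state)                        # hypothetical LIM propagator
Q = 0.1 * np.eye(n_state)                        # hypothetical forecast-noise covariance
H = rng.standard_normal((n_obs, n_state)) * 0.1  # hypothetical proxy operator
R = 0.5 * np.eye(n_obs)                          # hypothetical proxy-error covariance

ens = rng.standard_normal((n_state, n_ens))      # analysis ensemble at time t

# Forecast step: propagate each member and add stochastic forcing.
ens_f = G @ ens + rng.multivariate_normal(np.zeros(n_state), Q, n_ens).T

# Update step: stochastic EnKF with perturbed observations.
y = rng.standard_normal(n_obs)                   # stand-in proxy values
X = ens_f - ens_f.mean(axis=1, keepdims=True)    # state perturbations
Y = H @ ens_f                                    # proxy estimates from the forecast
Yp = Y - Y.mean(axis=1, keepdims=True)
K = (X @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (n_ens - 1) * R)
y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
ens_a = ens_f + K @ (y_pert - Y)                 # analysis ensemble at t+1
```

Because the forecast carries information forward in time, each analysis benefits from all earlier observations; this is what distinguishes the online approach from the offline DA of item 4.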
  2. Abstract The development of deep learning (DL) weather forecasting models has made rapid progress, achieving skill comparable to or better than traditional numerical weather prediction (NWP) models, which are generally computationally intensive. However, applications of these DL models have yet to be fully explored, including for severe convective events. We evaluate the DL model Pangu‐Weather in forecasting tornadic environments at one‐day lead times using convective available potential energy (CAPE), 0–6 km bulk wind difference (BWD6), and 0–3 km storm‐relative helicity (SRH3). We also compare its performance to the National Centers for Environmental Prediction (NCEP)'s Global Forecast System (GFS), a traditional NWP model. Pangu‐Weather generally outperforms the GFS in predicting BWD6 and SRH3 at the grid point and hour closest to the storm report. However, Pangu‐Weather tends to underpredict the maximum values of all convective parameters across the surrounding grid points in the 1–2 hr before the storm compared to the GFS.
    Free, publicly-accessible full text available April 16, 2026
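The two shear-based parameters named in item 2 have standard definitions: BWD6 is the magnitude of the vector wind difference between 6 km and the surface, and SRH3 is the integral of storm-relative flow dotted with horizontal vorticity over the lowest 3 km. The sketch below computes both from an idealized profile; the wind values and storm motion are hypothetical, and a real evaluation would interpolate model output to these heights.

```python
import numpy as np

z = np.array([0., 500., 1000., 2000., 3000., 4000., 6000.])  # height AGL (m)
u = np.array([2., 6., 10., 14., 17., 19., 24.])              # west-east wind (m/s)
v = np.array([1., 4., 7., 9., 10., 10., 11.])                # south-north wind (m/s)
storm_u, storm_v = 8.0, 4.0                                  # assumed storm motion (m/s)

# BWD6: vector wind difference between 6 km and the surface.
bwd6 = np.hypot(u[-1] - u[0], v[-1] - v[0])

# SRH3: standard discretized storm-relative helicity over the lowest 3 km.
mask = z <= 3000.0
su, sv = u[mask] - storm_u, v[mask] - storm_v
srh3 = np.sum(su[1:] * sv[:-1] - su[:-1] * sv[1:])

print(f"BWD6 = {bwd6:.1f} m/s, SRH3 = {srh3:.1f} m^2/s^2")
```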
  3. Abstract A deep learning (DL) model, based on a transformer architecture, is trained on a climate‐model data set and compared with a standard linear inverse model (LIM) in the tropical Pacific. We show that the DL model produces more accurate forecasts compared to the LIM when tested on a reanalysis data set. We then assess the ability of an ensemble Kalman filter to reconstruct the monthly averaged upper ocean from a noisy set of 24 sea‐surface temperature observations designed to mimic existing coral proxy measurements, and compare results for the DL model and LIM. Due to signal damping in the DL model, we implement a novel inflation technique by adding noise from hindcast experiments. Results show that assimilating observations with the DL model yields better reconstructions than the LIM for observation averaging times ranging from 1 month to 1 year. The improved reconstruction is due to the enhanced predictive capabilities of the DL model, which map the memory of past observations to future assimilation times. 
    Free, publicly-accessible full text available November 1, 2025
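One plausible reading of the inflation step in item 3, sketched below: because the DL forecast damps ensemble variance, each forecast member is perturbed with a field drawn from an archive of hindcast errors before assimilation. The archive here is random stand-in data; the paper's noise source is its actual hindcast experiments.

```python
import numpy as np

rng = np.random.default_rng(1)
n_state, n_ens, n_hindcasts = 200, 50, 500

ens_f = rng.standard_normal((n_state, n_ens)) * 0.5            # damped DL forecast ensemble
hindcast_errors = rng.standard_normal((n_state, n_hindcasts))  # archived forecast-minus-truth fields

# Draw one archived error field per member and add it, restoring the
# variance the DL model removed while keeping spatially plausible structure.
draws = rng.choice(n_hindcasts, size=n_ens, replace=False)
ens_inflated = ens_f + hindcast_errors[:, draws]
```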
  4. Abstract Paleoclimate reconstructions are increasingly central to climate assessments, placing recent and future variability in a broader historical context. Several estimation methods produce plumes of climate trajectories that practitioners often want to compare to other reconstruction ensembles, or to deterministic trajectories produced by other means, such as global climate models. Of particular interest are “offline” data assimilation (DA) methods, which have recently been adapted to paleoclimatology. Offline DA lacks an explicit model connecting time instants, so its ensemble members are not true system trajectories. This obscures quantitative comparisons, particularly when considering the ensemble mean in isolation. We propose several resampling methods to introduce a priori constraints on temporal behavior, as well as a general notion, called plume distance, to carry out quantitative comparisons between collections of climate trajectories (“plumes”). The plume distance provides a norm in the same physical units as the variable of interest (e.g., °C for temperature) and lends itself to assessments of statistical significance. We apply these tools to four paleoclimate comparisons: (1) global mean surface temperature (GMST) in the online and offline versions of the Last Millennium Reanalysis (LMR v2.1); (2) GMST from these two ensembles to simulations of the Paleoclimate Model Intercomparison Project past1000 ensemble; (3) LMR v2.1 to the PAGES 2k (2019) GMST ensemble; and (4) northern hemisphere mean surface temperature from LMR v2.1 to the Büntgen et al. (2021) ensemble. Results generally show more compatibility between these ensembles than is visually apparent. The proposed methodology is implemented in an open-source Python package, and we discuss possible applications of the plume distance framework beyond paleoclimatology.
    Free, publicly-accessible full text available December 12, 2025
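The abstract of item 4 specifies only that the plume distance is a norm in the variable's physical units; the exact formula is in the paper. As a hypothetical stand-in with the right units, the sketch below takes the median root-mean-square difference over all cross-plume pairs of trajectories.

```python
import numpy as np

def plume_distance(plume_a, plume_b):
    """Median pairwise RMS distance between two trajectory ensembles.

    plume_a : array of shape (n_members_a, n_time)
    plume_b : array of shape (n_members_b, n_time)
    Not the paper's definition; an illustrative norm in the same units.
    """
    diffs = plume_a[:, None, :] - plume_b[None, :, :]  # all cross-plume pairs
    rms = np.sqrt((diffs ** 2).mean(axis=-1))          # per-pair RMS (e.g., degC)
    return np.median(rms)

rng = np.random.default_rng(2)
gmst_a = rng.standard_normal((100, 1000)) * 0.2        # stand-in GMST plume
gmst_b = rng.standard_normal((80, 1000)) * 0.2 + 0.1   # offset stand-in plume
print(f"plume distance ~ {plume_distance(gmst_a, gmst_b):.3f} degC")
```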
  5. Abstract Despite increased Atlantic hurricane risk, projected trends in hurricane frequency in the warming climate remain highly uncertain, mainly due to the short instrumental record, which limits our understanding of hurricane activity and its relationship to climate. Here we extend the record to the last millennium using two independent estimates: a reconstruction from sedimentary paleohurricane records and a statistical model of hurricane activity driven by sea surface temperatures (SSTs). We find statistically significant agreement between the two estimates, and the late 20th-century hurricane frequency is within the range seen over the past millennium. Numerical simulations using a hurricane-permitting climate model suggest that hurricane activity was likely driven by endogenous climate variability and linked to anomalous SSTs, with a warm Atlantic and a cold Pacific. Volcanic eruptions can induce peaks in hurricane activity, but such peaks would likely be too weak to be detected in the proxy record given the large endogenous variability.
    Free, publicly-accessible full text available December 1, 2025
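A common form for SST-driven statistical hurricane models, though not necessarily the one used in item 5, is a Poisson regression of annual storm counts on SST indices. The sketch below fits that generic form to synthetic stand-in data in which a warm Atlantic and a cold Pacific raise the expected count, mirroring the relationship the abstract describes.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_years = 150
atl_sst = rng.standard_normal(n_years) * 0.3   # stand-in Atlantic SST anomaly (K)
pac_sst = rng.standard_normal(n_years) * 0.4   # stand-in tropical Pacific SST anomaly (K)

# Synthetic truth: log of the expected annual count is linear in the indices.
rate = np.exp(np.log(8) + 0.8 * atl_sst - 0.5 * pac_sst)
counts = rng.poisson(rate)

# Fit the Poisson GLM; the coefficients should roughly recover the truth.
X = sm.add_constant(np.column_stack([atl_sst, pac_sst]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.params)   # approximately [log(8), 0.8, -0.5]
```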
  6. Abstract. Climate field reconstruction (CFR) refers to the estimation of spatiotemporal climate fields (such as surface temperature) from a collection of pointwise paleoclimate proxy datasets. Such reconstructions can provide rich information on climate dynamics and an out-of-sample validation of climate models. However, most CFR workflows are complex and time-consuming, as they involve (i) preprocessing of the proxy records, climate model simulations, and instrumental observations; (ii) application of one or more statistical methods; and (iii) analysis and visualization of the reconstruction results. Historically, this process has lacked transparency and accessibility, limiting reproducibility and experimentation by non-specialists. This article presents an open-source, object-oriented Python package called cfr that aims to make CFR workflows easy to understand and conduct, sparing climatologists low-level technical details and facilitating efficient and reproducible research. cfr provides user-friendly utilities for common CFR tasks such as proxy and climate data analysis and visualization, proxy system modeling, and modularized workflows for multiple reconstruction methods, enabling methodological intercomparisons within the same framework. The package is supported by extensive documentation of the application programming interface (API) and a growing number of tutorial notebooks illustrating its usage. As an example, we present two cfr-driven reconstruction experiments using the PAGES 2k temperature database, applying the last millennium reanalysis (LMR) paleoclimate data assimilation (PDA) framework and the graphical expectation–maximization (GraphEM) algorithm, respectively.
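Item 6 describes the standard CFR pipeline: preprocess proxies, calibrate proxy system models (PSMs), then reconstruct fields. The self-contained sketch below is a conceptual stand-in for that pipeline, not the cfr package's actual API: it calibrates a linear PSM against an instrumental overlap, forward-models a prior ensemble, and applies one LMR-style offline update for a single year at a single proxy site.

```python
import numpy as np

rng = np.random.default_rng(4)
n_grid, n_ens = 300, 100

# (i) Preprocess: a "proxy" record with an overlapping instrumental series.
inst_temp = rng.standard_normal(80)
proxy = 1.5 * inst_temp + rng.standard_normal(80) * 0.4

# (ii) PSM: regress proxy on temperature to get slope, intercept, error.
slope, intercept = np.polyfit(inst_temp, proxy, 1)
resid_var = np.var(proxy - (slope * inst_temp + intercept))

# (iii) Offline DA: update a model-prior ensemble given one proxy value.
prior = rng.standard_normal((n_grid, n_ens))   # prior ensemble of temperature fields
cell = 42                                      # grid cell the proxy "lives" at
ye = slope * prior[cell] + intercept           # PSM-forward-modeled proxy estimates
obs = 0.9                                      # one year's proxy value
cov = (prior - prior.mean(1, keepdims=True)) @ (ye - ye.mean()) / (n_ens - 1)
gain = cov / (np.var(ye, ddof=1) + resid_var)  # Kalman gain for a scalar obs
posterior = prior + np.outer(gain, obs - ye)   # updated ensemble
```

In the real package these stages are wrapped in documented classes with tutorial notebooks; the point here is only the flow of information from proxy to field.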
  7. Here, we show that the Last Glacial Maximum (LGM) provides a stronger constraint on equilibrium climate sensitivity (ECS), the global warming from increasing greenhouse gases, after accounting for temperature patterns. Feedbacks governing ECS depend on spatial patterns of surface temperature (“pattern effects”); hence, using the LGM to constrain future warming requires quantifying how temperature patterns produce different feedbacks during LGM cooling versus modern-day warming. Combining data assimilation reconstructions with atmospheric models, we show that the climate is more sensitive to LGM forcing because ice sheets amplify extratropical cooling where feedbacks are destabilizing. Accounting for LGM pattern effects yields a median modern-day ECS of 2.4°C, 66% range 1.7° to 3.5°C (1.4° to 5.0°C, 5 to 95%), from LGM evidence alone. Combining the LGM with other lines of evidence, the best estimate becomes 2.9°C, 66% range 2.4° to 3.5°C (2.1° to 4.1°C, 5 to 95%), substantially narrowing uncertainty compared to recent assessments. 
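The reasoning of item 7 in one worked equation: at equilibrium, forcing and feedback balance (F + λΔT = 0), so the LGM gives a feedback λ_LGM = -F_LGM/ΔT_LGM, and ECS follows from the modern feedback λ_mod = λ_LGM + Δλ, where Δλ is the pattern effect. All numbers below are illustrative placeholders, not the paper's values.

```python
f2x = 4.0           # hypothetical CO2-doubling forcing (W/m^2)
f_lgm = -8.0        # hypothetical net LGM forcing (W/m^2)
dT_lgm = -6.0       # hypothetical LGM global cooling (K)

lam_lgm = -f_lgm / dT_lgm     # -1.33 W/m^2/K: feedback during LGM cooling
d_lam = -0.3                  # hypothetical pattern effect: modern warming lacks
                              # the ice-sheet-amplified extratropical cooling, so
                              # its feedback is more stabilizing (more negative)
lam_mod = lam_lgm + d_lam

ecs_naive = -f2x / lam_lgm    # 3.0 K if pattern effects are ignored
ecs_adjusted = -f2x / lam_mod # ~2.45 K after the pattern-effect correction
print(ecs_naive, ecs_adjusted)
```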
  8. Abstract Scientific understanding of low-frequency tropical Pacific variability, especially responses to perturbations in radiative forcing, suffers from short observational records, sparse proxy networks, and bias in model simulations. Here, we combine the strengths of proxies and models through coral-based paleoclimate data assimilation. We combine coral archives (δ18O, Sr/Ca) with the dynamics, spatial teleconnections, and intervariable relationships of the CMIP5/PMIP3 Past1000 experiments using the Last Millennium Reanalysis data assimilation framework. This analysis creates skillful reconstructions of tropical Pacific temperatures over the observational era. However, during the period of intense volcanism in the early nineteenth century, southwestern Pacific corals produce El Niño–Southern Oscillation (ENSO) reconstructions that are of opposite sign to those from eastern Pacific corals and tree-ring records. We systematically evaluate the source of this discrepancy using 1) single-proxy experiments, 2) varied proxy system models (PSMs), and 3) diverse covariance patterns from the Past1000 simulations. We find that individual proxy records and coral PSMs do not significantly contribute to the discrepancy. However, following major eruptions, the southwestern Pacific corals locally record more persistent cold anomalies than are found in the Past1000 experiments, and canonical ENSO teleconnections to the southwest Pacific strongly control the reconstruction response. Furthermore, using covariance patterns independent of ENSO yields reconstructions consistent with coral archives across the Pacific. These results show that model bias can strongly affect how proxy information is processed in paleoclimate data assimilation. As we illustrate here, model bias influences the magnitude and persistence of the response of the tropical Pacific to volcanic eruptions.
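Item 8 forward-models coral δ18O and Sr/Ca through PSMs before assimilation. A common linear form, sketched below, maps SST and sea-surface salinity anomalies to the measured quantities; the -0.22 permil/K δ18O slope is a widely used literature value, while the salinity coefficient (standing in for seawater δ18O) and the Sr/Ca slope are illustrative.

```python
import numpy as np

def coral_d18o_psm(sst_anom, sss_anom, a_sst=-0.22, a_sss=0.3):
    """Linear coral d18O PSM: d18O falls with warming, rises with salinity.

    Coefficients are illustrative, not the study's calibrated values.
    """
    return a_sst * sst_anom + a_sss * sss_anom

def coral_srca_psm(sst_anom, a_sst=-0.06):
    """Linear coral Sr/Ca PSM: Sr/Ca decreases with warming (mmol/mol per K)."""
    return a_sst * sst_anom

sst = np.array([0.5, -1.2, 2.0])   # stand-in SST anomalies (K)
sss = np.array([0.1, -0.3, 0.2])   # stand-in salinity anomalies (psu)
print(coral_d18o_psm(sst, sss), coral_srca_psm(sst))
```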